Training ν-Support Vector Classifiers: Theory and Algorithms

Authors

  • Chih-Chung Chang
  • Chih-Jen Lin
Abstract

The ν-support vector machine (ν-SVM) for classification proposed by Schölkopf et al. has the advantage of using a parameter ν to control the number of support vectors. In this paper, we investigate the relation between ν-SVM and C-SVM in detail. We show that, in general, they are two different problems with the same optimal solution set. Hence we may expect that many numerical aspects of solving them are similar. However, compared with the regular C-SVM, the ν-SVM formulation is more complicated, so up to now there have been no effective methods for solving large-scale ν-SVM. We propose a decomposition method for ν-SVM that is competitive with existing methods for C-SVM. We also discuss the behavior of ν-SVM through some numerical experiments.
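As a rough illustration of the role of ν (this is not the decomposition method proposed in the paper), the sketch below trains ν-SVM and C-SVM classifiers with scikit-learn's LIBSVM-based NuSVC and SVC; the synthetic data set, the RBF kernel, and the particular values of ν and C are assumptions chosen only for demonstration.

    # Illustrative sketch (assumed setup, not the paper's code): how nu in
    # nu-SVM relates to the number of support vectors, versus C in C-SVM.
    from sklearn.datasets import make_classification
    from sklearn.svm import NuSVC, SVC

    # Assumed synthetic two-class data set.
    X, y = make_classification(n_samples=400, n_features=10, random_state=0)

    # nu in (0, 1] upper-bounds the fraction of margin errors and
    # lower-bounds the fraction of support vectors.
    for nu in (0.1, 0.3, 0.5):
        model = NuSVC(nu=nu, kernel="rbf", gamma="scale").fit(X, y)
        frac_sv = model.support_.size / X.shape[0]
        print(f"nu = {nu:.1f}: fraction of support vectors = {frac_sv:.2f}")

    # C > 0 trades off margin width against training error but gives no
    # direct handle on how many support vectors are selected.
    for C in (0.1, 1.0, 10.0):
        model = SVC(C=C, kernel="rbf", gamma="scale").fit(X, y)
        print(f"C = {C}: number of support vectors = {model.support_.size}")

Under these assumptions, one would expect the reported support-vector fraction to stay at or above each chosen ν, whereas varying C changes the number of support vectors only indirectly.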


Similar Articles

Application of ensemble learning techniques to model the atmospheric concentration of SO2

For pollution-prediction modeling, the study adopts homogeneous (random forest, bagging, and additive regression) and heterogeneous (voting) ensemble classifiers to predict the atmospheric concentration of sulphur dioxide (SO2). For model validation, results were compared against widely known single base classifiers such as support vector machine, multilayer perceptron, linear regression and re...

Algorithms for Sparse Linear Classifiers in the Massive Data Setting

Classifiers favoring sparse solutions, such as support vector machines, relevance vector machines, LASSO-regression based classifiers, etc., provide competitive methods for classification problems in high dimensions. However, current algorithms for training sparse classifiers typically scale quite unfavorably with respect to the number of training examples. This paper proposes online and multi-...

Coulomb Classifiers: Generalizing Support Vector Machines via an Analogy to Electrostatic Systems

We introduce a family of classifiers based on a physical analogy to an electrostatic system of charged conductors. The family, called Coulomb classifiers, includes the two best-known support vector machines (SVMs), the ν-SVM and the C-SVM. In the electrostatics analogy, a training example corresponds to a charged conductor at a given location in space, the classification function corresponds to...

The 2ν-SVM: A Cost-Sensitive Extension of the ν-SVM

Standard classification algorithms aim to minimize the probability of making an incorrect classification. In many important applications, however, some kinds of errors are more important than others. In this report we review cost-sensitive extensions of standard support vector machines (SVMs). In particular, we describe cost-sensitive extensions of the C-SVM and the ν-SVM, which we denote the 2...
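As a hedged aside (not the 2ν-SVM itself, which scikit-learn does not provide), the sketch below shows a simpler cost-sensitive route: weighting one class's errors more heavily in a C-SVM via scikit-learn's class_weight option. The data set and the 5:1 cost ratio are assumptions chosen for illustration.

    # Hedged sketch: cost-sensitive C-SVM via per-class error weights in
    # scikit-learn (a stand-in illustration, not the 2nu-SVM described above).
    from sklearn.datasets import make_classification
    from sklearn.metrics import confusion_matrix
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    # Assumed imbalanced two-class data set (roughly 80% / 20%).
    X, y = make_classification(n_samples=600, weights=[0.8, 0.2], random_state=1)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)

    # Penalize errors on class 1 five times as heavily as errors on class 0.
    clf = SVC(C=1.0, kernel="rbf", class_weight={0: 1.0, 1: 5.0}).fit(X_tr, y_tr)
    print(confusion_matrix(y_te, clf.predict(X_te)))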

A Random Sampling Technique for Training Support Vector Machines (For Primal-Form Maximal-Margin Classifiers)

Random sampling techniques have been developed for combinatorial optimization problems. In this note, we report an application of one of these techniques for training support vector machines (more precisely, primal-form maximal-margin classifiers) that solve two-group classification problems by using hyperplane classifiers. Through this research, we are aiming (I) to design efficient and theore...


Journal:

Volume:   Issue:

Pages:

Publication date: 2008